Convex Synthesis of Accelerated Gradient Algorithms

Authors

Abstract

Related Databases: Web of Science

Article Data
History: Submitted 12 February 2021; Accepted 31 May 2021; Published online 14 December 2021
Keywords: optimization algorithms, robust control, algorithm synthesis
AMS Subject Headings: 93D09, 93D15, 93D25, 90C22, 90C25
ISSN (print): 0363-0129; ISSN (online): 1095-7138
Publisher: Society for Industrial and Applied Mathematics
CODEN: sjcodc


Related articles

Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...


Gradient algorithms for polygonal approximation of convex contours

The subjects of this paper are descent algorithms to optimally approximate a strictly convex contour with a polygon. This classic geometric problem is relevant in interpolation theory and data compression, and has potential applications in robotic sensor networks. We design gradient descent laws for intuitive performance metrics such as the area of the inner, outer, and “outer minus inner” appr...


Better Mini-Batch Algorithms via Accelerated Gradient Methods

Mini-batch algorithms have been proposed as a way to speed up stochastic convex optimization problems. We study how such algorithms can be improved using accelerated gradient methods. We provide a novel analysis, which shows how standard gradient methods may sometimes be insufficient to obtain a significant speed-up and propose a novel accelerated gradient algorithm, which deals with this defic...
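In its basic deterministic form, the accelerated gradient method referred to here is Nesterov's scheme: a gradient step taken from an extrapolated point, with a momentum weight built from the sequence t_k. A minimal sketch, not the paper's mini-batch variant; the test problem, step count, and step size are illustrative:

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps=500):
    """Nesterov's accelerated gradient method for an L-smooth convex function.

    grad : callable returning the gradient of f
    L    : Lipschitz constant of grad (step size 1/L)
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(steps):
        x_next = y - grad(y) / L                          # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative problem: f(x) = 0.5 * ||A x - b||^2, so grad f(x) = A^T (A x - b)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient: ||A||_2^2
x_star = nesterov_agd(lambda z: A.T @ (A @ z - b), np.zeros(2), L)
```

The momentum weight (t_k - 1)/t_{k+1} is what lifts the convergence rate from O(1/k) for plain gradient descent to O(1/k^2).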


New Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization

New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's method for unconstrained optimization, are proposed. Using the exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For inexact line search the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...


Conditional gradient algorithms for norm-regularized smooth convex optimization

Motivated by some applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone K, a norm ‖·‖, and a smooth convex function f, we want either (1) to minimize the norm over the intersection of the cone and a level set of f, or (2) to minimize over the cone the sum of f and a multiple of the norm. We focus on the case where (a) the di...
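A conditional gradient (Frank–Wolfe) method needs only a linear minimization oracle over the feasible set at each step, so it avoids projections entirely. A minimal sketch for the closely related constrained form, minimizing a smooth f over an ℓ1-ball; the example problem and radius are illustrative, not from the paper:

```python
import numpy as np

def frank_wolfe_l1(grad, radius, x0, steps=200):
    """Conditional gradient (Frank-Wolfe) over the l1-ball of the given radius.

    Each iteration calls only a linear minimization oracle -- here, picking
    the best signed vertex of the l1-ball -- so no projection is needed.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        g = grad(x)
        i = int(np.argmax(np.abs(g)))    # vertex minimizing <g, s> over the ball
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (k + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative problem: least squares restricted to ||x||_1 <= 1
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 0.0])
x_fw = frank_wolfe_l1(lambda z: A.T @ (A @ z - b), radius=1.0, x0=np.zeros(2))
```

Because each iterate is a convex combination of ℓ1-ball vertices, the iterates are also sparse, which is one reason conditional gradient methods are attractive for norm-regularized problems.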



Journal

Journal title: SIAM Journal on Control and Optimization

سال: 2021

ISSN: 0363-0129 (print), 1095-7138 (online)

DOI: https://doi.org/10.1137/21m1398598